AI Tools Present New Risks in 2024 US Election

2023-05-18

The risk of misinformation influencing voters has long existed during elections around the world. But political experts warn that new artificial intelligence (AI) tools may create even more problems during the 2024 American presidential election.

Newly released AI tools are widely available at no cost to the public. Such tools can be used to present voters with false, or fake, information in an effort to help candidates, experts say.

Misinformation can take many forms, including fake news stories that seem real. Voters can also be targeted with high-quality video or audio that appears to show real candidates but was actually created by machines.

Today, AI systems can produce fake voices, images and videos in seconds. And they can be created for little or no cost to the users. The tools, described as "generative AI," have received wide attention since the launch of ChatGPT late last year.

Technology experts say these kinds of AI tools can be used to invent false messages in many different forms. The material can then be published to social media services or fake online news websites. Such material can be effective because it can target specific groups of people.

"We're not prepared for this," said A.J. Nash, vice president of intelligence at the internet security company ZeroFox. "To me, the big leap forward is the audio and video capabilities that have emerged," Nash told The Associated Press.

AI experts have identified possible situations in which AI could be used to trick the voting public. Among these are AI-generated phone messages that seem to be in a candidate's own voice. Such messages could misinform voters about the correct voting date.

Fake voice recordings could also be created to show a candidate admitting to a crime or expressing racist opinions. Video could appear to show candidates giving a speech or comments they never gave. False images could be designed to look like local news reports claiming a candidate dropped out of the race.

Former President Donald Trump, who is running for president in 2024, has shared AI-generated content with his followers. Recently, Trump shared a manipulated video of CNN news presenter Anderson Cooper on his Truth Social service.

Another example of what AI tools can do is a campaign advertisement released last month by the Republican National Committee (RNC). The online ad came after Democratic President Joe Biden officially announced his reelection campaign.

The ad began with an image of Biden and the words, "What if the weakest president we've ever had was re-elected?" The ad then showed a series of AI-created images. These included Taiwan under attack, closed businesses in the U.S., and soldiers and military vehicles driving down streets.

RNC officials confirmed the use of AI in the ad. But other individuals or groups will likely develop and release such material in secret, said Petko Stoyanov. He is chief technology officer at Forcepoint, an internet security company in Austin, Texas. Stoyanov predicted groups aiming to interfere with U.S. democracy will employ AI tools and fake media material to reduce the public's trust.

He gave the example of international organizations, overseas criminals or states using AI tools to impersonate an American politician. In such cases, Stoyanov said, he is unsure whether U.S. officials or law enforcement could do anything about it.

Legislation has been introduced in the U.S. House of Representatives to require candidates to identify campaign ads created with AI. Some states have offered their own proposals aimed at reducing the use of fake AI-generated content.

AI is not necessarily needed, however, to create false or misleading audio, video or written materials.

New York Representative Yvette Clarke said her biggest fear is that AI tools could be used before the 2024 election to create material that incites violence and turns Americans against each other.

"It's important that we keep up with the technology," Clarke told the AP. "People are busy with their lives, and they don't have the time to check every piece of information. AI being weaponized, in a political season, it could be extremely disruptive," she said.

I'm Bryan Lynn.
Bryan Lynn wrote this story for VOA Learning English, based on reports from The Associated Press.

________________________________________________________________
Words in This Story

leap - n. a big jump

emerge - v. to become known

manipulate - v. to control someone in a way that makes them do what you want them to do

impersonate - v. to copy the way someone looks and behaves

disrupt - v. to block something and stop it from continuing as it should

___________________________________________________________________
  • 28
  • What do you think of this story? We want to hear from you. We have a new comment system. Here is how it works: